
    Concept-Oriented Deep Learning with Large Language Models

    Large Language Models (LLMs) have been successfully used in many natural-language tasks and applications, including text generation and AI chatbots. They are also a promising new technology for concept-oriented deep learning (CODL). However, the prerequisite is that LLMs understand concepts and can ensure conceptual consistency. We discuss these requirements in this paper, as well as the major uses of LLMs for CODL: concept extraction from text, concept graph extraction from text, and concept learning. Human knowledge consists of both symbolic (conceptual) knowledge and embodied (sensory) knowledge. Text-only LLMs, however, can represent only symbolic (conceptual) knowledge. Multimodal LLMs, by contrast, can represent the full range (conceptual and sensory) of human knowledge. We discuss conceptual understanding in visual-language LLMs, the most important class of multimodal LLMs, and their major uses for CODL: concept extraction from images, concept graph extraction from images, and concept learning. While these uses of LLMs for CODL are valuable standalone, they are particularly valuable as part of LLM applications such as AI chatbots.
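
    A minimal sketch of how prompt-based concept and concept-graph extraction from text might look, assuming a generic LLM client. The `llm` callable, the prompt wording, and the JSON schema are all illustrative placeholders, not the paper's method.

```python
import json

def extract_concept_graph(llm, text):
    """Ask an LLM for the concepts and relation triples found in `text`.

    `llm` is a placeholder for any prompt -> response-string callable
    (e.g. a thin wrapper around a chat-completion API).
    """
    prompt = (
        "List the concepts mentioned in the text below, then the relations "
        "between them as (source, relation, target) triples. Respond only "
        'with JSON of the form {"concepts": [...], "relations": [[s, r, t], ...]}.'
        "\n\nText: " + text
    )
    # Assumes the model complies with the JSON-only instruction; production
    # code would validate the output and retry on malformed JSON.
    return json.loads(llm(prompt))
```

    The returned triples can be treated as labeled edges, so a concept graph follows directly by loading them into any graph library.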

    Variational Quantum Kernels with Task-Specific Quantum Metric Learning

    Quantum kernel methods, i.e., kernel methods with quantum kernels, offer distinct advantages as a hybrid quantum-classical approach to quantum machine learning (QML), including applicability to Noisy Intermediate-Scale Quantum (NISQ) devices and to all types of machine learning problems. Kernel methods rely on a notion of similarity between points in a higher (possibly infinite) dimensional feature space. For machine learning, this notion assumes that points close in the feature space should also be close in the machine learning task space. In this paper, we discuss the use of variational quantum kernels with task-specific quantum metric learning to generate optimal quantum embeddings (a.k.a. quantum feature encodings) that are specific to machine learning tasks. Such task-specific optimal quantum embeddings, which implicitly support feature selection, are valuable not only for improving the performance of quantum kernel methods, but also for non-kernel QML methods based on parameterized quantum circuits (PQCs), as pretrained embeddings and for transfer learning. This further demonstrates the quantum utility, and the quantum advantage (with classically intractable quantum embeddings), of quantum kernel methods.
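
    A minimal sketch of a variational quantum kernel trained by kernel-target alignment, one common form of quantum metric learning, using PennyLane. The ansatz (angle embedding followed by trainable entangling layers), the alignment loss, and the toy data are illustrative assumptions, not the paper's exact construction.

```python
import pennylane as qml
from pennylane import numpy as np

n_wires = 2
dev = qml.device("default.qubit", wires=n_wires)

def embedding(x, weights):
    # Task-specific quantum embedding U(x; theta): a data-encoding layer
    # followed by trainable entangling layers.
    qml.AngleEmbedding(x, wires=range(n_wires))
    qml.BasicEntanglerLayers(weights, wires=range(n_wires))

@qml.qnode(dev)
def kernel_circuit(x1, x2, weights):
    # Apply U(x1) then U(x2)^dagger; the probability of measuring |0...0>
    # equals the fidelity kernel k(x1, x2) = |<phi(x2)|phi(x1)>|^2.
    embedding(x1, weights)
    qml.adjoint(embedding)(x2, weights)
    return qml.probs(wires=range(n_wires))

def kernel(x1, x2, weights):
    return kernel_circuit(x1, x2, weights)[0]

def alignment_loss(weights, X, y):
    # Negative kernel-target alignment: pushes k(xi, xj) toward 1 for
    # same-label pairs and toward 0 otherwise (labels y are +/-1).
    ky, kk = 0.0, 0.0
    for i in range(len(X)):
        for j in range(len(X)):
            k = kernel(X[i], X[j], weights)
            ky = ky + k * y[i] * y[j]
            kk = kk + k * k
    return -ky / (np.sqrt(kk) * len(X))

# Toy metric-learning loop on synthetic data.
X = np.array([[0.1, 0.2], [0.3, 0.1], [2.0, 2.2], [2.3, 1.9]], requires_grad=False)
y = np.array([1, 1, -1, -1], requires_grad=False)
weights = np.random.uniform(0, np.pi, (3, n_wires), requires_grad=True)

opt = qml.GradientDescentOptimizer(stepsize=0.2)
for step in range(25):
    weights = opt.step(lambda w: alignment_loss(w, X, y), weights)
```

    After training, the weights can be frozen and the resulting kernel matrix passed to a classical learner (e.g. an SVM with a precomputed kernel), which is the usual hybrid quantum-classical workflow; the trained embedding itself can also serve as a pretrained encoding for PQC-based models.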